# 768-Dimensional Dense Vector Models
## Finance Embeddings Investopedia
The Investopedia embedding model developed by the FinLang team for financial applications. Fine-tuned from BAAI/bge-base-en-v1.5, it maps sentences and paragraphs into a 768-dimensional dense vector space, making it suitable for semantic search and other tasks in the financial domain.
Tags: Text Embedding · Author: FinLang · 21.25k downloads · 32 likes
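Every model in this list exposes the same interface: text in, a 768-dimensional dense vector out, with semantic search then reducing to nearest-neighbor lookup by cosine similarity. A minimal sketch of that lookup step, using NumPy only; the 3-dimensional toy vectors and quoted sentences are invented stand-ins for real 768-dimensional embeddings produced by one of these models:

```python
import numpy as np

# Toy stand-ins for 768-dimensional sentence embeddings; a real embedding
# model would produce these from the raw text shown in the comments.
corpus_embeddings = np.array([
    [0.9, 0.1, 0.0],   # "stock market volatility"
    [0.0, 1.0, 0.1],   # "baking sourdough bread"
    [0.8, 0.2, 0.1],   # "equity index futures"
])
query_embedding = np.array([1.0, 0.0, 0.0])  # "share price swings"

def cosine_sim(query, corpus):
    """Cosine similarity between one vector and each row of a matrix."""
    return (corpus @ query) / (
        np.linalg.norm(corpus, axis=1) * np.linalg.norm(query)
    )

scores = cosine_sim(query_embedding, corpus_embeddings)
best = int(np.argmax(scores))  # index of the most similar corpus entry
```

With these toy vectors, both finance-themed rows score far above the unrelated one, which is exactly the behavior a domain-tuned embedding model aims for on real text.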
## Finetuning Bm25 Small
A sentence-similarity model based on sentence-transformers, capable of mapping text to a 768-dimensional vector space.
Tags: Text Embedding · Author: jhsmith · 15 downloads · 0 likes
## S DagoBERT STSb
A sentence embedding model based on sentence-transformers that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence-similarity calculation, semantic search, and clustering.
Tags: Text Embedding, Transformers · Author: jpostma · 13 downloads · 0 likes
## Transformer
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence-similarity calculation, clustering, and semantic search.
Tags: Text Embedding, Transformers · Author: kpourdeilami · 44 downloads · 0 likes
## Keyphrase Mpnet V1
A sentence-transformers model optimized for phrases, mapping them into a 768-dimensional dense vector space, suitable for tasks like clustering or semantic search.
Tags: Text Embedding, Transformers · Author: uclanlp · 4,278 downloads · 2 likes
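Clustering, mentioned repeatedly above, works directly on these vectors: phrases whose embeddings have high cosine similarity are grouped together. A minimal greedy sketch of that idea (an illustrative stand-in, not any listed model's own method; the 2-dimensional toy vectors and phrases are invented, and real phrase embeddings would be 768-dimensional):

```python
import numpy as np

# Toy phrase vectors; a keyphrase embedding model would produce
# 768-dimensional versions of these from the phrases themselves.
phrases = ["neural network", "deep learning", "coffee roasting", "espresso brewing"]
vecs = np.array([
    [1.0, 0.1],
    [0.9, 0.2],
    [0.1, 1.0],
    [0.2, 0.9],
])

def normalize(m):
    """Scale each row to unit length so dot products equal cosine similarity."""
    return m / np.linalg.norm(m, axis=1, keepdims=True)

v = normalize(vecs)
sim = v @ v.T  # pairwise cosine similarities

# Greedy single-pass clustering: attach a phrase to the first cluster whose
# first member is similar enough, otherwise start a new cluster.
threshold = 0.8
clusters = []  # list of lists of phrase indices
for i in range(len(phrases)):
    for c in clusters:
        if sim[i, c[0]] >= threshold:
            c.append(i)
            break
    else:
        clusters.append([i])
```

On the toy data this yields two clusters, one per topic; production pipelines typically swap the greedy pass for k-means or agglomerative clustering over the same similarity matrix.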
## Contriever Gpl Hotpotqa
A sentence-transformers model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as sentence-similarity calculation and semantic search.
Tags: Text Embedding, Transformers · Author: income · 13 downloads · 0 likes
## Bertin Roberta Base Finetuning Esnli
A Spanish sentence embedding model based on BERTIN RoBERTa, optimized for natural language inference tasks.
Tags: Text Embedding, Spanish · Author: somosnlp-hackathon-2022 · 103 downloads · 7 likes